
Search: All records where Creators/Authors contains "Buhayh, Anas"


  1. Fairness in recommender systems is a complex concept, involving multiple definitions, different parties for whom fairness is sought, and various scopes over which fairness might be measured. Researchers seeking fairness-aware systems have derived a variety of solutions, usually highly tailored to specific choices along each of these dimensions, and typically aimed at tackling a single fairness concern, i.e., a single definition for a specific stakeholder group and measurement scope. However, in practical contexts, there is a multiplicity of fairness concerns within a given recommendation application, and solutions limited to a single dimension are therefore less useful. We explore a general solution to recommender system fairness using social choice methods to integrate multiple heterogeneous definitions. In this paper, we extend group-fairness results from prior research to provider-side individual fairness, demonstrating on multiple datasets that both individual and group fairness objectives can be integrated and optimized jointly. We identify both synergies and tensions among different objectives, with individual fairness correlated with group fairness for some groups and anti-correlated with others. (An illustrative sketch of measuring and combining such objectives follows this entry.)
    Free, publicly-accessible full text available September 7, 2026
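    The paper's actual social choice mechanism is not reproduced here; the following minimal Python sketch only illustrates the kind of joint measurement the abstract describes. It computes a toy provider-side group fairness score (exposure parity between two provider groups), a toy individual fairness score (one minus the Gini coefficient of provider exposure), and a simple weighted aggregation of the two. The metrics, weights, and synthetic exposure data are all illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical setup: 100 providers, each in one of two groups;
    # 'exposure' is a toy count of how often each provider's items
    # appear in top-k recommendation lists.
    n_providers = 100
    group = rng.integers(0, 2, size=n_providers)
    exposure = rng.gamma(shape=2.0, scale=1.0, size=n_providers)

    def group_fairness(exposure, group):
        """1 minus the normalized gap in mean exposure between the two
        provider groups (1.0 = equal mean exposure)."""
        gap = abs(exposure[group == 0].mean() - exposure[group == 1].mean())
        return 1.0 - gap / exposure.mean()

    def individual_fairness(exposure):
        """1 minus the Gini coefficient of provider exposure
        (1.0 = every provider receives identical exposure)."""
        x = np.sort(exposure)
        n = len(x)
        gini = (2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum())
        return 1.0 - gini

    def combined_objective(exposure, group, w_group=0.5, w_indiv=0.5):
        """A simple weighted aggregation of the two objectives -- a
        stand-in for, not a reproduction of, the paper's social choice
        integration of heterogeneous fairness definitions."""
        return (w_group * group_fairness(exposure, group)
                + w_indiv * individual_fairness(exposure))

    print(f"group fairness:      {group_fairness(exposure, group):.3f}")
    print(f"individual fairness: {individual_fairness(exposure):.3f}")
    print(f"combined objective:  {combined_objective(exposure, group):.3f}")
    ```

    Comparing how the two component scores move as recommendation lists change would be one simple way to surface the synergies and tensions the abstract reports, though the paper's own analysis is more involved.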
  2. Recommender systems have a variety of stakeholders. Applying concepts of fairness in such systems requires attention to stakeholders’ complex and often-conflicting needs. Since fairness is socially constructed, there are numerous definitions, both in the social science and machine learning literatures. Still, it is rare for machine learning researchers to develop their metrics in close consideration of their social context. More often, standard definitions are adopted and assumed to be applicable across contexts and stakeholders. Our research starts with a recommendation context and then seeks to understand the breadth of the fairness considerations of associated stakeholders. In this paper, we report on the results of a semi-structured interview study with 23 employees who work for the Kiva microlending platform. We characterize the many different ways in which they enact and strive toward fairness for microlending recommendations in their own work, uncover the ways in which these different enactments of fairness are in tension with each other, and identify how stakeholders are differentially prioritized. Finally, we reflect on the implications of this study for future research and for the design of multistakeholder recommender systems. 